Minimum Bayes error feature selection

Authors

  • George Saon
  • Mukund Padmanabhan
Abstract

We consider the problem of designing a linear transformation θ ∈ IR^{p×n}, of rank p ≤ n, which projects the features of a classifier x ∈ IR^n onto y = θx ∈ IR^p such as to achieve minimum Bayes error (or probability of misclassification). Two avenues will be explored: the first is to maximize the θ-average divergence between the class densities and the second is to minimize the union Bhattacharyya bound in the range of θ. While both approaches yield similar performance in practice, they outperform standard LDA features and show a 10% relative improvement in the word error rate over state-of-the-art cepstral features on a large vocabulary telephony speech recognition task.
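The union Bhattacharyya bound mentioned in the abstract can be illustrated numerically. The sketch below (not the authors' implementation; the example means, covariances, and projections are made up for illustration) computes the Bhattacharyya distance between two Gaussian class densities after a linear projection y = θx, and the resulting bound on the two-class Bayes error: for equal priors, P_e ≤ (1/2)·exp(−D_B).

```python
import numpy as np

def bhattacharyya_distance(mu1, cov1, mu2, cov2):
    """Bhattacharyya distance D_B between two Gaussian densities."""
    cov = 0.5 * (cov1 + cov2)
    diff = mu1 - mu2
    # Mean-separation term: (1/8) (mu1-mu2)^T cov^{-1} (mu1-mu2)
    term_mean = 0.125 * diff @ np.linalg.solve(cov, diff)
    # Covariance-mismatch term: (1/2) ln( det(cov) / sqrt(det(cov1) det(cov2)) )
    term_cov = 0.5 * np.log(
        np.linalg.det(cov) / np.sqrt(np.linalg.det(cov1) * np.linalg.det(cov2))
    )
    return term_mean + term_cov

def projected_bhattacharyya(theta, mu1, cov1, mu2, cov2):
    """Bhattacharyya distance in the range of theta, i.e. for y = theta @ x."""
    return bhattacharyya_distance(
        theta @ mu1, theta @ cov1 @ theta.T,
        theta @ mu2, theta @ cov2 @ theta.T,
    )

# Toy example: two 3-D Gaussian classes separated along the first axis.
mu1, mu2 = np.array([0.0, 0.0, 0.0]), np.array([3.0, 0.0, 0.0])
cov1 = cov2 = np.eye(3)

good = np.array([[1.0, 0.0, 0.0]])   # rank-1 projection onto the separating axis
bad  = np.array([[0.0, 0.0, 1.0]])   # projection onto a non-discriminative axis

d_good = projected_bhattacharyya(good, mu1, cov1, mu2, cov2)
d_bad  = projected_bhattacharyya(bad,  mu1, cov1, mu2, cov2)

# Bhattacharyya bound on the Bayes error (equal priors): smaller is better.
bound_good = 0.5 * np.exp(-d_good)   # discriminative theta -> tight bound
bound_bad  = 0.5 * np.exp(-d_bad)    # bad theta -> vacuous bound of 0.5
```

A minimization over θ of this bound (summed over class pairs for the union bound) is the second avenue described in the abstract; the toy run shows why a discriminative projection yields a much smaller error bound than a non-discriminative one.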


Similar resources

A New Approach for Text Documents Classification with Invasive Weed Optimization and Naive Bayes Classifier

With the rapid increase in the number of documents, Text Document Classification (TDC) methods have become a crucial matter. This paper presents a hybrid model of Invasive Weed Optimization (IWO) and a Naive Bayes (NB) classifier (IWO-NB) for Feature Selection (FS), in order to reduce the large size of the feature space in TDC. TDC includes different actions such as text processing, feature extraction, form...


Feature Selection by Maximum Marginal Diversity

We address the question of feature selection in the context of visual recognition. It is shown that, besides being efficient from a computational standpoint, the infomax principle is nearly optimal in the minimum Bayes error sense. The concept of marginal diversity is introduced, leading to a generic principle for feature selection (the principle of maximum marginal diversity) of extreme computationa...


Minimum Bayes Error Feature Selection for Continuous Speech Recognition

We consider the problem of designing a linear transformation θ, of rank p, which projects the features of a classifier x onto y = θx such as to achieve minimum Bayes error (or probability of misclassification). Two avenues will be explored: the first is to maximize the θ-average divergence between the class densities and the second is to minimize the union Bhattacharyya bound in the range of θ. While both a...


Discriminative Clustering Based Feature Selection and Nonparametric Bayes Error Minimization and Support Vector Machines (SVMs)

In recent years, feature selection has become a prominent task in knowledge discovery in databases (KDD), selecting appropriate features from massive amounts of high-dimensional data. In an attempt to establish theoretical justification for feature selection algorithms, this work presents a theoretical optimality criterion, specifically, the discriminative optimal criterion (DoC) for feature selection. Computa...


Canadian Robotic Vision Special Issue Minimum Bayes Error Features for Visual Recognition

The design of optimal feature sets for visual classification problems is still one of the most challenging topics in the area of computer vision. In this work, we propose a new algorithm that computes optimal features, in the minimum Bayes error sense, for visual recognition tasks. The algorithm now proposed combines the fast convergence rate of feature selection (FS) procedures with the abilit...



Journal title:

Volume   Issue 

Pages  -

Publication date: 2000